
    Beyond Geometry: Towards Fully Realistic Wireless Models

    Signal-strength models of wireless communications capture the gradual fading of signals and the additivity of interference. As such, they are closer to reality than other models. However, nearly all theoretical work in the SINR model depends on the assumption of smooth geometric decay, one that holds in free space but is far off in actual environments. The challenge is to model realistic environments, including walls, obstacles, reflections and anisotropic antennas, without making the models algorithmically impractical or analytically intractable. We present a simple solution that allows the modeling of arbitrary static situations by moving from geometry to arbitrary decay spaces. The complexity of a setting is captured by a metricity parameter Z that indicates how far the decay space is from satisfying the triangle inequality. All results that hold in the SINR model in general metrics carry over to decay spaces, with the resulting time complexity and approximation depending on Z in the same way that the original results depend on the path-loss term alpha. For distributed algorithms, which to date have appeared to depend necessarily on planarity, we indicate how they can be adapted to arbitrary decay spaces. Finally, we explore the dependence on Z in the approximability of core problems. In particular, we observe that the capacity maximization problem has exponential upper and lower bounds in terms of Z in general decay spaces. In Euclidean metrics and related growth-bounded decay spaces, the performance depends on the exact metricity definition, with a polynomial upper bound in terms of Z, but an exponential lower bound in terms of a variant parameter phi. On the plane, the upper bound result actually yields the first approximation of a capacity-type SINR problem that is subexponential in alpha.
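
    The abstract does not spell out how the metricity parameter Z is defined; as a rough, hedged illustration (an assumed reading, not the paper's definition), the sketch below computes, for a finite decay space given as a matrix of decay distances, the smallest factor Z such that d(u,w) <= Z*(d(u,v) + d(v,w)) for all triples, so Z = 1 for a metric and larger values record stronger violations of the triangle inequality.

        # Illustrative sketch (assumed reading of "metricity", not taken from the paper):
        # the smallest Z with d(u, w) <= Z * (d(u, v) + d(v, w)) for all triples of points.
        from itertools import permutations

        def metricity(d):
            """d: n x n symmetric matrix of decay distances, d[u][u] == 0, d[u][v] > 0 otherwise."""
            n = len(d)
            z = 1.0
            for u, v, w in permutations(range(n), 3):
                z = max(z, d[u][w] / (d[u][v] + d[v][w]))
            return z

        # A metric space gives Z = 1; an obstacle that attenuates one link much more than
        # a two-hop detour pushes Z above 1.
        d = [[0, 1, 10],
             [1, 0, 1],
             [10, 1, 0]]
        print(metricity(d))  # 5.0: the direct link 0-2 decays far more than the detour via 1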

    Fast algorithms for the Tron game on trees

    Tron is the following game, which can be played on any graph: two players alternately choose a node of the graph, subject to the requirement that each player must choose a node adjacent to his previously chosen node and that every node is chosen at most once. In this paper, O(n) and O(n√n) algorithms are given for deciding whether there is a winning strategy for the first player when Tron is played on a given tree, for the variants with and without specified starting nodes, respectively. The problem is shown to be both NP-hard and co-NP-hard for connected undirected graphs in general.

    Crossing Paths with Hans Bodlaender: A Personal View on Cross-Composition for Sparsification Lower Bounds

    On the occasion of Hans Bodlaender’s 60th birthday, I give a personal account of our history and work together on the technique of cross-composition for kernelization lower bounds. I present several simple new proofs for polynomial kernelization lower bounds using cross-composition, interlaced with personal anecdotes about my time as Hans’ PhD student at Utrecht University. Concretely, I will prove that Vertex Cover, Feedback Vertex Set, and the H-Factor problem for every graph H that has a connected component of at least three vertices, do not admit kernels of O(n^{2-epsilon}) bits when parameterized by the number of vertices n, for any epsilon > 0, unless NP is in coNP/poly. These lower bounds are obtained by elementary gadget constructions, in particular avoiding the use of the Packing Lemma by Dell and van Melkebeek.

    On the (non-)existence of polynomial kernels for P_l-free edge modification problems

    Given a graph G = (V,E) and an integer k, an edge modification problem for a graph property P consists in deciding whether there exists a set F of at most k edges such that the graph H = (V, E △ F), where △ denotes symmetric difference, satisfies the property P. In the P edge-completion problem, the set F of edges is constrained to be disjoint from E; in the P edge-deletion problem, F is a subset of E; no constraint is imposed on F in the P edge-edition problem. A number of optimization problems can be expressed in terms of graph modification problems, which have been extensively studied in the context of parameterized complexity. When parameterized by the size k of the edge set F, it has been proved that if P is a hereditary property characterized by a finite set of forbidden induced subgraphs, then the three P edge-modification problems are FPT. It was then natural to ask whether these problems also admit a polynomial-size kernel. Using recent lower bound techniques, Kratsch and Wahlström answered this question negatively. However, the problem remains open for many natural graph classes characterized by forbidden induced subgraphs. Kratsch and Wahlström asked whether the result holds when the forbidden subgraphs are paths or cycles, and pointed out that the problem is already open in the case of P_4-free graphs (i.e. cographs). This paper provides positive and negative results in that line of research. We prove that parameterized cograph edge modification problems have cubic vertex kernels, whereas polynomial kernels are unlikely to exist for the P_l-free and C_l-free edge-deletion problems for large enough l.
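
    As a hedged illustration of the problem definition only (the function names and the brute-force check below are ours, not the paper's algorithms or kernels), this sketch verifies that an edge-edition set F of size at most k turns G into a P_4-free graph, i.e. a cograph.

        # Illustrative sketch of the edge-edition problem for the property "P_4-free".
        # Brute force, for intuition only; not an FPT algorithm or a kernel.
        from itertools import combinations

        def is_p4_free(vertices, edges):
            """Return True if no four vertices induce a path P_4."""
            adjacent = lambda u, v: frozenset((u, v)) in edges
            for quad in combinations(vertices, 4):
                induced = [pair for pair in combinations(quad, 2) if adjacent(*pair)]
                if len(induced) != 3:
                    continue                                 # an induced P_4 has exactly 3 edges
                degrees = sorted(sum(1 for e in induced if v in e) for v in quad)
                if degrees == [1, 1, 2, 2]:                  # degree sequence of a path on 4 vertices
                    return False
            return True

        def is_valid_edition(vertices, edges, F, k):
            """Edge-edition: F may both add and delete edges; |F| <= k must hold."""
            return len(F) <= k and is_p4_free(vertices, edges ^ F)

        V = {1, 2, 3, 4}
        E = {frozenset(e) for e in [(1, 2), (2, 3), (3, 4)]}   # the path P_4
        F = {frozenset((1, 4))}                                 # close it into the cycle C_4
        print(is_valid_edition(V, E, F, k=1))                   # True: C_4 is a cograph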

    Faster Algorithms for Algebraic Path Properties in Recursive State Machines with Constant Treewidth

    Interprocedural analysis is at the heart of numerous applications in programming languages, such as alias analysis, constant propagation, etc. Recursive state machines (RSMs) are standard models for interprocedural analysis. We consider a general framework with RSMs where the transitions are labeled from a semiring and path properties are algebraic with semiring operations. RSMs with algebraic path properties can model interprocedural dataflow analysis problems, the shortest path problem, the most probable path problem, etc. The traditional algorithms for interprocedural analysis focus on path properties where the starting point is fixed as the entry point of a specific method. In this work, we consider multiple queries, as required in many applications such as alias analysis. The study of multiple queries allows us to bring in an important algorithmic distinction between the resource usage of one-time preprocessing and that of each individual query. The second aspect that we consider is that the control flow graphs of most programs have constant treewidth. Our main contributions are simple and implementable algorithms that support multiple queries for algebraic path properties on RSMs that have constant treewidth. Our theoretical results show that our algorithms have small additional one-time preprocessing cost, but can answer subsequent queries significantly faster than the current best-known solutions for several important problems, such as interprocedural reachability and shortest path. We provide a prototype implementation for interprocedural reachability and intraprocedural shortest path that gives a significant speed-up on several benchmarks.
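
    To make the phrase "path properties are algebraic with semiring operations" concrete, here is a small hedged sketch (ours, not the paper's framework or its treewidth-based algorithms): a path's value is the semiring product of its edge labels and a query sums over paths, so the tropical (min, +) semiring yields shortest paths and the Boolean (or, and) semiring yields reachability.

        # Generic Bellman-Ford-style fixpoint over a semiring on a plain graph.
        # Instantiating (plus, times, zero, one) recovers classic analyses.
        def algebraic_paths(n, edges, plus, times, zero, one, source):
            """edges: list of (u, v, label); returns the semiring value of paths source -> v."""
            val = [zero] * n
            val[source] = one
            for _ in range(n - 1):                  # enough rounds for simple paths
                for u, v, label in edges:
                    val[v] = plus(val[v], times(val[u], label))
            return val

        edges = [(0, 1, 2.0), (1, 2, 3.0), (0, 2, 10.0)]
        # Shortest path: tropical semiring (min, +), zero = infinity, one = 0
        print(algebraic_paths(3, edges, min, lambda a, b: a + b, float("inf"), 0.0, 0))
        # Reachability: Boolean semiring (or, and), zero = False, one = True
        print(algebraic_paths(3, [(u, v, True) for u, v, _ in edges],
                              lambda a, b: a or b, lambda a, b: a and b, False, True, 0))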

    Compact Labelings For Efficient First-Order Model-Checking

    We consider graph properties that can be checked from labels, i.e., bit sequences of logarithmic length attached to vertices. We prove that there exists such a labeling for checking a first-order formula with free set variables in the graphs of every class that is nicely locally cwd-decomposable. This notion generalizes that of a nicely locally tree-decomposable class. The graphs of such classes can be covered by graphs of bounded clique-width with limited overlaps. We also consider such labelings for bounded first-order formulas on graph classes of bounded expansion. Some of these results are extended to counting queries.
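
    The labeling schemes in the paper handle first-order formulas on special graph classes; as a much simpler hedged illustration of the general idea of "checking a property from short labels" (a folklore construction, not the paper's), the sketch below labels the nodes of a rooted tree with DFS entry/exit times, O(log n) bits each, so that ancestry queries are answered from the two labels alone.

        # Folklore ancestry-labeling sketch: each node gets (entry, exit) DFS times,
        # and "is u an ancestor of v?" is decided by interval containment of the labels.
        def label_tree(root, children):
            labels, clock = {}, 0
            def dfs(v):
                nonlocal clock
                start = clock
                clock += 1
                for c in children.get(v, []):
                    dfs(c)
                labels[v] = (start, clock)
                clock += 1
            dfs(root)
            return labels

        def is_ancestor(label_u, label_v):
            """Uses only the two labels, never the tree itself."""
            return label_u[0] <= label_v[0] and label_v[1] <= label_u[1]

        children = {"r": ["a", "b"], "a": ["c"]}
        L = label_tree("r", children)
        print(is_ancestor(L["r"], L["c"]), is_ancestor(L["a"], L["b"]))  # True False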

    Linear-time Algorithms for Eliminating Claws in Graphs

    Since many NP-complete graph problems have been shown to be polynomial-time solvable when restricted to claw-free graphs, we study the problem of determining the distance of a given graph to a claw-free graph, considering vertex elimination as the measure. CLAW-FREE VERTEX DELETION (CFVD) consists of determining the minimum number of vertices to be removed from a graph such that the resulting graph is claw-free. Although CFVD is NP-complete in general, and recognizing claw-free graphs is still a challenge (the current best algorithm for a graph G has the same running time as the best algorithm for matrix multiplication), we present linear-time algorithms for CFVD on weighted block graphs and weighted graphs with bounded treewidth. Furthermore, we show that this problem can be solved in linear time by a simpler algorithm on forests, and we determine the exact values for full k-ary trees. On the other hand, we show that CLAW-FREE VERTEX DELETION is NP-complete even when the input graph is a split graph. We also show that the problem is hard to approximate within any constant factor better than 2, assuming the Unique Games Conjecture.
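
    For readers new to the terminology, here is a short hedged sketch (ours; the paper's algorithms are linear-time and far more refined): a claw is an induced K_{1,3}, i.e. a center with three pairwise non-adjacent neighbours, and CFVD asks for the fewest vertices whose removal destroys all claws.

        # Brute-force illustration of claws and of the CFVD objective; exponential time,
        # for tiny examples only.
        from itertools import combinations

        def find_claw(adj):
            """adj: dict vertex -> set of neighbours. Returns (center, leaves) or None."""
            for center, nbrs in adj.items():
                for leaves in combinations(sorted(nbrs), 3):
                    if all(b not in adj[a] for a, b in combinations(leaves, 2)):
                        return center, leaves
            return None

        def min_claw_free_deletion(adj):
            """Smallest vertex set whose removal leaves a claw-free graph."""
            vertices = list(adj)
            for size in range(len(vertices) + 1):
                for removed in combinations(vertices, size):
                    rest = {v: adj[v] - set(removed) for v in adj if v not in removed}
                    if find_claw(rest) is None:
                        return set(removed)

        star = {0: {1, 2, 3}, 1: {0}, 2: {0}, 3: {0}}   # K_{1,3}: the claw itself
        print(find_claw(star))                           # (0, (1, 2, 3))
        print(min_claw_free_deletion(star))              # deleting a single vertex suffices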

    Tree decompositions with small cost

    The f-cost of a tree decomposition ({X_i | i in I}, T = (I,F)) for a function f : N -> R+ is defined as sum_{i in I} f(|X_i|). This measure is associated with the running time or memory use of some algorithms that use the tree decomposition. In this paper we investigate the problem of finding tree decompositions of minimum f-cost. A function f : N -> R+ is fast if f(i+1) >= 2*f(i) for every i in N. We show that for fast functions f, every graph G has a tree decomposition of minimum f-cost that corresponds to a minimal triangulation of G; if f is not fast, this does not hold. We give polynomial-time algorithms for the problem, assuming f is a fast function, for graphs that have a polynomial number of minimal separators, for graphs of treewidth at most two, and for cographs, and show that the problem is NP-hard for bipartite graphs and for cobipartite graphs. We also discuss results for a weighted variant of the problem, derived from an application in probabilistic networks.
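
    The f-cost definition translates directly into a few lines of code; a minimal sketch (the bag sizes and functions below are just examples, not taken from the paper) is:

        # f-cost of a tree decomposition: sum of f(|X_i|) over its bags X_i.
        # With f(x) = 2**x this models the exponential-in-bag-size work of typical
        # dynamic programming over a tree decomposition; note that 2**x is "fast"
        # in the paper's sense, since 2**(x+1) >= 2 * 2**x.
        def f_cost(bags, f):
            return sum(f(len(bag)) for bag in bags)

        bags = [{1, 2, 3}, {2, 3, 4}, {3, 4, 5}]    # bags of a path-shaped decomposition
        print(f_cost(bags, lambda x: 2 ** x))        # 24: three bags of size 3, each costing 2^3
        print(f_cost(bags, lambda x: x))             # 9: with f(x) = x the cost just adds bag sizes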

    Vertex Cover Kernelization Revisited: Upper and Lower Bounds for a Refined Parameter

    An important result in the study of polynomial-time preprocessing shows that there is an algorithm which, given an instance (G,k) of Vertex Cover, outputs an equivalent instance (G',k') in polynomial time with the guarantee that G' has at most 2k' vertices (and thus O((k')^2) edges) with k' <= k. Using the terminology of parameterized complexity, we say that k-Vertex Cover has a kernel with 2k vertices. There is complexity-theoretic evidence that both 2k vertices and Theta(k^2) edges are optimal for the kernel size. In this paper we consider the Vertex Cover problem with a different parameter, the size fvs(G) of a minimum feedback vertex set for G. This refined parameter is structurally smaller than the parameter k associated with the vertex covering number vc(G), since fvs(G) <= vc(G) and the difference can be arbitrarily large. We give a kernel for Vertex Cover with a number of vertices that is cubic in fvs(G): an instance (G,X,k) of Vertex Cover, where X is a feedback vertex set for G, can be transformed in polynomial time into an equivalent instance (G',X',k') such that |V(G')| <= 2k and |V(G')| <= O(|X'|^3). A similar result holds when the feedback vertex set X is not given along with the input. In sharp contrast, we show that the Weighted Vertex Cover problem does not have a polynomial kernel when parameterized by the cardinality of a given vertex cover of the graph, unless NP is in coNP/poly and the polynomial hierarchy collapses to the third level.
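
    As a hedged illustration of what a kernel is (this is the folklore Buss-style reduction for the standard parameter k; it is neither the 2k-vertex kernel nor the paper's fvs-parameterized kernel), the sketch below shrinks a k-Vertex Cover instance to at most k^2 edges or decides it outright.

        # Folklore high-degree rule for k-Vertex Cover, for illustration only:
        # a vertex of degree > k must be in any vertex cover of size <= k, and if more
        # than k^2 edges remain afterwards the instance is a "no".
        def buss_kernel(edges, k):
            """edges: set of frozenset pairs. Returns (reduced_edges, k', forced_answer)."""
            edges = set(edges)
            changed = True
            while changed and k >= 0:
                changed = False
                degree = {}
                for e in edges:
                    for v in e:
                        degree[v] = degree.get(v, 0) + 1
                for v, d in degree.items():
                    if d > k:                       # v is forced into the cover
                        edges = {e for e in edges if v not in e}
                        k -= 1
                        changed = True
                        break
            if k < 0:
                return set(), k, False              # no cover of the requested size exists
            if len(edges) > k * k:
                return edges, k, False              # too many edges left: a no-instance
            return edges, k, None                   # kernel with at most k^2 edges

        edges = {frozenset(e) for e in [(0, i) for i in range(1, 6)] + [(1, 2)]}
        print(buss_kernel(edges, k=2))              # vertex 0 has degree 5 > 2, so it is forced in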